
Markov chain (noun)

  • (probability theory) A discrete-time stochastic process with the Markov property.
Finnish: Markovin ketju
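
For illustration, the Markov property named in the definition can be stated as follows (a minimal formulation assuming a discrete state space, with X_n denoting the state at step n; the notation is an added convention, not from the entry):

\[ \Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n) \]

That is, the next state depends only on the current state, not on the earlier history of the process.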